Estimating Nonlinear Neural Response Functions using GP Priors and Kronecker Methods

Cristina Savin, Gasper Tkacik

Neural Information Processing Systems

Jointly characterizing neural responses in terms of several external variables promises novel insights into circuit function, but remains computationally prohibitive in practice.


Export Reviews, Discussions, Author Feedback and Meta-Reviews

Neural Information Processing Systems

First provide a summary of the paper, and then address the following criteria: Quality, clarity, originality and significance. The authors combine two recent advances in Gaussian processes, spectral mixture kernels [5] and scalable Gaussian processes for data on grids [14, 22; see below], to tackle applications with large numbers of data points, such as texture extrapolation, inpainting, and video extrapolation. The paper includes a thorough evaluation of the proposed framework, with comparisons against sparse GP methods using both general-purpose covariance functions and spectral mixture kernels. Quality: The paper is technically sound. The proposed framework achieves outstanding results across the applications studied in the paper.


Export Reviews, Discussions, Author Feedback and Meta-Reviews

Neural Information Processing Systems

Submitted by Assigned_Reviewer_1. Q1: The authors propose a flexible and interpretable kernel (the CSM kernel), building on spectral mixture kernels, for learning relationships between multiple tasks. The starting point is to use Gaussian processes with one-component spectral mixture kernels as the basis functions in a linear model of coregionalisation (SM-LMC). However, SM-LMC does not capture phase information between channels. The authors therefore propose the cross-spectral mixture kernel, which mixes phase-shifted versions of spectral mixture kernels across channels. The resulting kernel is both interpretable and flexible.
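The phase-shift idea the review describes can be illustrated with a toy cross-covariance in which every channel pair shares one spectral component but carries a relative phase inside the cosine. This is an illustrative sketch only, not the paper's exact CSM parameterisation; all names are made up for the example.

```python
import numpy as np

def phase_shifted_cross_cov(tau, mu, v, phases, amps):
    """Toy cross-covariance at lag tau: channel pair (i, j) shares one
    spectral component (mean mu, variance v) but is shifted by the
    relative phase phases[i] - phases[j] inside the cosine."""
    C = len(phases)
    K = np.empty((C, C))
    decay = np.exp(-2 * np.pi**2 * tau**2 * v)
    for i in range(C):
        for j in range(C):
            K[i, j] = (amps[i] * amps[j] * decay
                       * np.cos(2 * np.pi * tau * mu + phases[i] - phases[j]))
    return K

# At a nonzero lag, distinct phases make K asymmetric -- as expected for a
# cross-covariance, where K_ij(tau) = K_ji(-tau).
K = phase_shifted_cross_cov(0.1, mu=1.0, v=0.02,
                            phases=np.array([0.0, 0.5]),
                            amps=np.array([1.0, 1.0]))
print(K)
```

With equal phases the matrix is symmetric again, which is one way to see that the phase parameters are exactly what encodes lead/lag structure between channels.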


Marginalised Gaussian Processes with Nested Sampling

Fergus Simpson

Neural Information Processing Systems

Gaussian Process models define a rich distribution over functions, with inductive biases controlled by a kernel function. Learning typically proceeds by optimising the kernel hyperparameters with the marginal likelihood as the objective.
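The marginal-likelihood objective mentioned in the abstract can be sketched in a few lines. This is a generic illustration with a squared-exponential kernel and illustrative names, not the paper's nested-sampling approach:

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale, variance):
    """Squared-exponential kernel; its hyperparameters set the inductive bias."""
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def log_marginal_likelihood(X, y, lengthscale, variance, noise):
    """log p(y | X, theta) for a zero-mean GP -- the usual training objective."""
    n = len(X)
    K = rbf_kernel(X, X, lengthscale, variance) + noise * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))       # log-determinant via Cholesky
            - 0.5 * n * np.log(2 * np.pi))

# Toy data: a well-matched lengthscale should score higher than a degenerate one.
rng = np.random.default_rng(0)
X = np.linspace(0, 1, 30)
y = np.sin(2 * np.pi * X) + 0.05 * rng.standard_normal(30)
print(log_marginal_likelihood(X, y, 0.2, 1.0, 0.01))
print(log_marginal_likelihood(X, y, 0.01, 1.0, 0.01))
```

Maximising this quantity over the hyperparameters is the point estimation that the paper's nested-sampling approach replaces with full marginalisation.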


Spectral Mixture Kernels for Bayesian Optimization

Zhang, Yi, Hua, Cheng

arXiv.org Artificial Intelligence

Bayesian Optimization (BO) is a widely used approach for solving expensive black-box optimization tasks. However, selecting an appropriate probabilistic surrogate model remains an important yet challenging problem. In this work, we introduce a novel Gaussian Process (GP)-based BO method that incorporates spectral mixture kernels, derived from spectral densities formed by scale-location mixtures of Cauchy and Gaussian distributions. This method achieves a significant improvement in both efficiency and optimization performance, matching the computational speed of simpler kernels while delivering results that outperform more complex models and automatic BO methods. We provide bounds on the information gain and cumulative regret associated with obtaining the optimum. Extensive numerical experiments demonstrate that our method consistently outperforms existing baselines across a diverse range of synthetic and real-world problems, including both low- and high-dimensional settings.
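For reference, the standard Gaussian-only spectral mixture form (the kernel whose spectral density is a mixture of Gaussians) can be evaluated directly as k(τ) = Σ_q w_q exp(−2π²τ²v_q) cos(2πτμ_q). The sketch below shows only this standard form, not the paper's Cauchy–Gaussian scale-location mixture; names are illustrative.

```python
import numpy as np

def spectral_mixture_kernel(tau, weights, means, variances):
    """Stationary SM kernel on lags tau:
    k(tau) = sum_q w_q * exp(-2 pi^2 tau^2 v_q) * cos(2 pi tau mu_q).
    Each component corresponds to one Gaussian in the spectral density."""
    tau = np.asarray(tau, dtype=float)[..., None]   # broadcast over components
    return np.sum(weights
                  * np.exp(-2 * np.pi**2 * tau**2 * variances)
                  * np.cos(2 * np.pi * tau * means), axis=-1)

# One component with spectral mean 2: the kernel oscillates at that frequency,
# damped by the component's spectral variance.
tau = np.linspace(0, 2, 9)
print(spectral_mixture_kernel(tau, np.array([1.0]), np.array([2.0]), np.array([0.05])))
```

Because k(0) equals the sum of the weights, the weights act as signal variances, while the spectral means and variances set the periodicities and their decay lengths.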


Fast Kernel Learning for Multidimensional Pattern Extrapolation

Andrew Wilson, Elad Gilboa, John P. Cunningham, Arye Nehorai

Neural Information Processing Systems

The ability to automatically discover patterns and perform extrapolation is an essential quality of intelligent systems. Kernel methods, such as Gaussian processes, have great potential for pattern extrapolation, since the kernel flexibly and interpretably controls the generalisation properties of these methods. However, automatically extrapolating large scale multidimensional patterns is in general difficult, and developing Gaussian process models for this purpose involves several challenges. A vast majority of kernels, and kernel learning methods, currently only succeed in smoothing and interpolation. This difficulty is compounded by the fact that Gaussian processes are typically only tractable for small datasets, and scaling an expressive kernel learning approach poses different challenges than scaling a standard Gaussian process model.
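The grid-structure trick behind the scaling challenge can be illustrated independently of the paper: for inputs on a Cartesian product grid with a product kernel, the covariance factorises as K = K₁ ⊗ K₂, so matrix-vector products never require forming the full matrix. A minimal sketch (illustrative names only; the paper's full method also handles incomplete grids):

```python
import numpy as np

def kron_matvec(K1, K2, v):
    """Multiply (K1 kron K2) @ v without building the full Kronecker product.
    With row-major (NumPy) flattening, (K1 kron K2) vec(X) = vec(K1 @ X @ K2.T),
    turning an O((n1*n2)^2) matvec into two small matrix products."""
    n1, n2 = K1.shape[0], K2.shape[0]
    X = v.reshape(n1, n2)
    return (K1 @ X @ K2.T).ravel()

# Check against the explicit Kronecker product on a small grid.
rng = np.random.default_rng(1)
K1 = rng.standard_normal((4, 4)); K1 = K1 @ K1.T   # symmetric PSD factors
K2 = rng.standard_normal((5, 5)); K2 = K2 @ K2.T
v = rng.standard_normal(20)
dense = np.kron(K1, K2) @ v
fast = kron_matvec(K1, K2, v)
print(np.allclose(dense, fast))   # True
```

The same factorisation extends to eigendecompositions, which is what makes exact inference on large grids tractable.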


Export Reviews, Discussions, Author Feedback and Meta-Reviews

Neural Information Processing Systems

Rebuttal: thank you for your clarifications. I still think that learning kernel (parameters) from multiple realizations of a GP is not very novel in general, but it is sufficiently novel in your specific context to be discussed at NIPS. The authors use Gaussian processes to learn human function extrapolation behaviour from human sample data. After a comprehensive literature review, they introduce the main idea of the paper: learn the kernel parameters by maximizing the conditional probability of the extrapolation data given the training data. To allow for flexible kernel shapes, they use spectral mixture kernels.
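The learning objective the review describes, scoring hyperparameters by the GP posterior's probability of held-out points rather than by the marginal likelihood, can be sketched as follows. This is a generic 1-D illustration with an RBF kernel (not the paper's spectral mixture kernel), and all names are illustrative.

```python
import numpy as np

def rbf(X1, X2, ls):
    """Squared-exponential kernel on 1-D inputs."""
    return np.exp(-0.5 * (X1[:, None] - X2[None, :])**2 / ls**2)

def extrapolation_log_prob(Xtr, ytr, Xex, yex, ls, noise=0.01):
    """log p(y_extrap | y_train, theta): evaluate the GP posterior at the
    held-out inputs and score the held-out targets under it."""
    Ktr = rbf(Xtr, Xtr, ls) + noise * np.eye(len(Xtr))
    Ks = rbf(Xex, Xtr, ls)
    Kss = rbf(Xex, Xex, ls) + noise * np.eye(len(Xex))
    L = np.linalg.cholesky(Ktr)
    A = np.linalg.solve(L, Ks.T)            # A = L^{-1} Ks^T
    mu = A.T @ np.linalg.solve(L, ytr)      # posterior mean at Xex
    cov = Kss - A.T @ A                     # posterior covariance at Xex
    Lc = np.linalg.cholesky(cov)
    z = np.linalg.solve(Lc, yex - mu)
    return (-0.5 * z @ z - np.sum(np.log(np.diag(Lc)))
            - 0.5 * len(yex) * np.log(2 * np.pi))

# Held-out points between the training inputs: a sensible lengthscale should
# predict them far better than a degenerate, very short one.
Xtr = np.linspace(0, 1, 20)
ytr = np.sin(2 * np.pi * Xtr)
Xex = np.linspace(0.025, 0.975, 10)
yex = np.sin(2 * np.pi * Xex)
print(extrapolation_log_prob(Xtr, ytr, Xex, yex, 0.2))
print(extrapolation_log_prob(Xtr, ytr, Xex, yex, 0.005))
```

Maximising this conditional score over the kernel parameters directly rewards hyperparameters that generalise beyond the training inputs, which is the point of the paper's objective.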